1 - Machine Learning for Physicists [ID:52675]

Okay, I guess we have waited long enough.

Good evening, everyone.

Let's get started.

So, my name is Florian Marquardt.

I'm going to give this lecture series.

Last time you had my colleague Vittorio Peano give the kind of lecture-zero introduction,

but today we are really going to start with the material.

So this is a lecture series that I gave for the first time in 2017, which was basically

when my own research group started to embark on this interface between machine learning

and physics, and that was really brand new back then.

I don't think that at the time there was any such lecture series at German universities.

Now, of course, it has changed very much because by now this topic has become even more prominent.

So one important remark I want to make is that in principle you can find videos of past

iterations of this series on the internet, even on YouTube, but in this particular series

we're going to switch from one software framework to another, so you will

not be able to reuse all the old lectures.

So in particular we're going to switch to a framework called JAX.

I think Vittorio mentioned it; it's really perfect for physicists because it's basically

like the standard numerical calculations you would do as a physicist, and then on top of

that you have things like automatic differentiation, which is really at the core of neural networks.

So that's what is going to be part of this lecture series.
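To give a flavor of why automatic differentiation is so convenient, here is a minimal sketch in JAX. The function `f` below is just a made-up example for illustration, not something from the lecture: you write ordinary NumPy-style code, and `jax.grad` hands back its exact derivative.

```python
import jax
import jax.numpy as jnp

def f(x):
    # an ordinary "physics-style" function: f(x) = sin(x) * x^2
    return jnp.sin(x) * x**2

# jax.grad builds the exact derivative function automatically,
# with no finite differences and no derivation by hand:
df = jax.grad(f)

# analytically, f'(x) = cos(x) * x^2 + 2 * x * sin(x)
print(f(1.0), df(1.0))
```

This is the same mechanism that neural-network training relies on: the loss is just a function of the network parameters, and automatic differentiation supplies its gradient.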

Now, we still have a few organizational things.

The first important question today is how we want to start in the future:

we could start at 5 sharp, or we could start at 5:15.

Is there a problem if we start at 5 sharp? Who has a problem, maybe coming from another

lecture or something?

So that would be okay.

Then let's start at 5 sharp, then we are finished by 6:30, so that's also nice.

So start at 5 sharp in the future.

Then another announcement, probably you were already told, the first tutorial will actually

happen tomorrow on Friday, and it's going to take place at 4 o'clock.

Now again, I don't know whether sharp or not sharp, but I assume you could be there at

4 o'clock sharp. It's going to take place, just like all the following tutorials,

at the Max Planck Institute for the Science of Light, which is where I'm

based. This is just across the street, so to speak, across the parking lot; it's

this big black building, and you just have to enter at the main entrance. There is

a kind of front office where you have to quickly register, just because they don't want to have

anyone walk in, but that's just a formality. Then you walk up one flight of stairs, and there's

the big auditorium, the biggest seminar room that we have, and there will be the tutorial.

So tomorrow at 4 o'clock is also a good opportunity to visit our beautiful building.

But then there was a poll also for future tutorials, and according to what Vittorio

just told me today, the most agreeable time seems to be Monday evenings,

which I believe is 5 to 7, but we will also announce it on the official website.

That's it about the timing. Is there any question in that direction right now? No question? Okay,

so then let's get started for real, finally. Today we are going to introduce the structure

of neural networks, and I think these are slides that Vittorio may have covered already.

You can think of your brain as an input-output machine, very simplified, but still a good

point of view to take in this context. And if you peek inside, you know that it's in

principle based on electrical signals traveling along dendrites coming out of neurons,

so neurons are connected to neighboring neurons; it's a very complicated network, but it's

Part of a video series:

Accessible via

Open Access

Duration

01:25:23 min

Recording date

2024-04-25

Uploaded on

2024-04-26 12:59:05

Language

en-US
